Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization: Supplementary Material

Authors

  • Zhiqiang Xu
  • Peilin Zhao
  • Jianneng Cao
  • Xiaoli Li
Abstract

Preparation

First, based on the definitions of $A_t$, $Y_t$, $\tilde{Z}_t$ and $Z_t$, we can write
$$g_t = G(s_t, r_t, X_t) = p_{s_t}^{-1} p_{r_t}^{-1} (I - X_t X_t^\top)(E_{s_t} \odot A)(E_{\cdot r_t} \odot X_t) = (I - X_t X_t^\top) A_t Y_t.$$
Then from (6), we have
$$X_{t+1} = X_t + \alpha_t g_t W_t - \frac{\alpha_t^2}{2} X_t g_t^\top g_t W_t.$$
Since $W_t = \left(I + \frac{\alpha_t^2}{4} g_t^\top g_t\right)^{-1} = I - \frac{\alpha_t^2}{4} g_t^\top g_t + O(\alpha_t^4)$, we get
$$X_{t+1} = X_t + \alpha_t A_t Y_t - \alpha_t X_t X_t^\top A_t Y_t - \frac{\alpha_t^2}{2} X_t g_t^\top g_t + O(\alpha_t^3).$$
Let $\mathcal{F}_t$ be the set of all the random variables seen thus far (i.e., from $0$ to $t$); mathematically, this is known as a filtration, i.e., a sequence of sub-sigma-algebras such that $\mathcal{F}_t \subset \mathcal{F}_{t+1}$.

Proof. The proof technique follows (Balsubramani et al., 2013) and (Xie et al., 2015). Note that for two square matrices $Q_i$, $i = 1, 2$, the products $Q_1 Q_2$ and $Q_2 Q_1$ have the same spectrum, and that the spectral norm (i.e., the matrix 2-norm) is orthogonally invariant. Hence, we can write
$$\lambda_{\min}(Z_t^\top V V^\top Z_t) = \lambda_{\min}(V^\top Z_t Z_t^\top V) = \min_{y \neq 0} \frac{\|V^\top Z_t y\|_2^2}{\|y\|_2^2} = \min_{y \neq 0} \frac{\|V^\top \tilde{Z}_t y\|_2^2}{\|\tilde{Z}_t y\|_2^2} = \min_{y \neq 0} \frac{\|V^\top (X_t + \alpha_t A_t Y_t) y\|_2^2}{\|(X_t + \alpha_t A_t Y_t) y\|_2^2}.$$
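The feasibility of this update can be checked numerically. Below is a minimal NumPy sanity check (ours, not the authors' code): it draws a random orthonormal $X_t$, uses a random matrix as a stand-in for $A_t Y_t$, and verifies both that the $W_t$-based update lands exactly on the Stiefel manifold, i.e., $X_{t+1}^\top X_{t+1} = I$, and that it matches the third-order expansion above up to $O(\alpha_t^3)$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, a = 50, 5, 1e-2                              # sizes and step size (illustrative)

X, _ = np.linalg.qr(rng.standard_normal((n, k)))   # X_t with orthonormal columns
G = rng.standard_normal((n, k))                    # random stand-in for A_t Y_t
g = G - X @ (X.T @ G)                              # g_t = (I - X X^T) A_t Y_t

W = np.linalg.inv(np.eye(k) + (a**2 / 4) * (g.T @ g))
X_next = X + a * (g @ W) - (a**2 / 2) * X @ (g.T @ g) @ W

# The update is an exact retraction: X_{t+1}^T X_{t+1} = I up to round-off.
print(np.linalg.norm(X_next.T @ X_next - np.eye(k)))   # ~1e-15

# And it matches the expansion X + a g - (a^2/2) X g^T g up to O(a^3).
print(np.linalg.norm(X_next - (X + a * g - (a**2 / 2) * X @ (g.T @ g))))
```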


Similar Articles

Matrix Eigen-decomposition via Doubly Stochastic Riemannian Optimization

Matrix eigen-decomposition is a classic and long-standing problem that plays a fundamental role in scientific computing and machine learning. Despite existing algorithms for this inherently non-convex problem, current approaches remain inadequate for today's large-scale data. To address this gap, we propose a Doubly Stochastic Riemannian Gradient EIGenSolver, DSRG-EIGS, where the double sto...
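As the Preparation section above indicates, the double stochasticity refers to sampling both a row index $s_t$ of $A$ and a column index $r_t$ of $X_t$. The following is a hedged sketch of that estimator (our reading, with uniform sampling and the selectors $E_{s_t}$, $E_{\cdot r_t}$ interpreted as keeping one row of $A$ and one column of $X$); the Monte Carlo check at the end illustrates unbiasedness.

```python
# Sketch of g_t = p_s^{-1} p_r^{-1} (I - X X^T)(E_s ⊙ A)(E_r ⊙ X), uniform sampling.
import numpy as np

def doubly_stochastic_grad(A, X, rng):
    """One unbiased estimate of the Riemannian gradient (I - X X^T) A X."""
    n, k = X.shape
    s = rng.integers(n)                          # row index, p_s = 1/n
    r = rng.integers(k)                          # column index, p_r = 1/k
    A_t = np.zeros_like(A); A_t[s, :] = A[s, :]  # E_s ⊙ A (keep row s)
    Y_t = np.zeros_like(X); Y_t[:, r] = X[:, r]  # E_r ⊙ X (keep column r)
    G = (n * k) * (A_t @ Y_t)                    # rescale by p_s^{-1} p_r^{-1}
    return G - X @ (X.T @ G)                     # apply (I - X X^T)

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100)); A = (A + A.T) / 2    # symmetric test matrix
X, _ = np.linalg.qr(rng.standard_normal((100, 4)))
est = np.mean([doubly_stochastic_grad(A, X, rng) for _ in range(20000)], axis=0)
full = (np.eye(100) - X @ X.T) @ (A @ X)
print(np.linalg.norm(est - full) / np.linalg.norm(full))  # shrinks like 1/sqrt(#samples)
```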


A Fast Algorithm for Matrix Eigen-decomposition

We propose a fast stochastic Riemannian gradient eigensolver for a real symmetric matrix and prove its local, eigengap-dependent, linear convergence. The fast convergence is brought by deploying the variance reduction technique originally developed for strongly convex problems in the Euclidean setting. In this paper, this technique is generalized to Riemannian manifolds for solving the g...
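The variance-reduction step can be sketched in a few lines. The code below is our own simplified illustration, not the paper's algorithm: it assumes the matrix has the covariance form $A = \frac{1}{n}\sum_i a_i a_i^\top$, samples one component per step, and replaces the paper's transport of the snapshot gradient between tangent spaces with a single projection at the current iterate.

```python
# SVRG-style variance-reduced Riemannian gradient, simplified sketch.
# mu0 = A @ X0 is the full gradient at a snapshot X0, recomputed occasionally;
# near X0 the estimator's variance vanishes, which drives the fast convergence.
import numpy as np

def vr_riemannian_grad(rows, X, X0, mu0, i):
    gi = lambda Z: np.outer(rows[i], rows[i] @ Z)   # unbiased estimate of A @ Z
    v = gi(X) - gi(X0) + mu0                        # SVRG estimator of A @ X
    return v - X @ (X.T @ v)                        # project to tangent space at X
```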


Low-rank tensor completion: a Riemannian manifold preconditioning approach Supplementary material

A Proof and derivation of manifold-related ingredients

The concrete computations of the optimization-related ingredients presented in the paper are discussed below. The total space is $\mathcal{M} := \mathrm{St}(r_1, n_1) \times \mathrm{St}(r_2, n_2) \times \mathrm{St}(r_3, n_3) \times \mathbb{R}^{r_1 \times r_2 \times r_3}$. Each element $x \in \mathcal{M}$ has the matrix representation $(U_1, U_2, U_3, G)$. Invariance of the Tucker decomposition under the transformation $(U_1, U_2, U_3, G) \mapsto (U_1 O_1, U_2 O_2, U_3 O_3, G \times...$
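The invariance mentioned above is easy to confirm numerically. Here is a small illustration (ours, not the authors' code): the Tucker product $G \times_1 U_1 \times_2 U_2 \times_3 U_3$ is unchanged when each factor is rotated and the core is counter-rotated.

```python
# Numerical check of Tucker invariance under
# (U1, U2, U3, G) -> (U1 O1, U2 O2, U3 O3, G x_1 O1^T x_2 O2^T x_3 O3^T).
import numpy as np

def mode_product(T, M, mode):
    # Multiply tensor T by matrix M along the given mode: T x_mode M.
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def tucker(G, U1, U2, U3):
    return mode_product(mode_product(mode_product(G, U1, 0), U2, 1), U3, 2)

rng = np.random.default_rng(0)
n, r = (6, 7, 8), (2, 3, 4)
G = rng.standard_normal(r)
Us = [np.linalg.qr(rng.standard_normal((n[i], r[i])))[0] for i in range(3)]
Os = [np.linalg.qr(rng.standard_normal((r[i], r[i])))[0] for i in range(3)]

X1 = tucker(G, *Us)
G2 = tucker(G, Os[0].T, Os[1].T, Os[2].T)           # counter-rotated core
X2 = tucker(G2, *(U @ O for U, O in zip(Us, Os)))   # rotated factors
print(np.linalg.norm(X1 - X2))                      # ~1e-15: the same tensor
```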


Low-rank tensor completion: a Riemannian manifold preconditioning approach

We propose a novel Riemannian manifold preconditioning approach for the tensor completion problem with a rank constraint. A novel Riemannian metric, or inner product, is proposed that exploits the least-squares structure of the cost function and takes into account the structured symmetry that exists in the Tucker decomposition. The specific metric allows the use of the versatile framework of Riemannian opt...


Riemannian Optimization for Skip-Gram Negative Sampling

The Skip-Gram Negative Sampling (SGNS) word embedding model, well known through its implementation in the "word2vec" software, is usually optimized by stochastic gradient descent. However, optimization of the SGNS objective can be viewed as a problem of searching for a good matrix under a low-rank constraint. The most standard way to solve this type of problem is to apply the Riemannian optimization framework...
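As a generic illustration of this low-rank Riemannian viewpoint (our sketch, not the paper's method, which uses a more refined retraction and the actual SGNS objective), one optimization step can be written as a Euclidean gradient move followed by a retraction onto the rank-$d$ matrices via truncated SVD:

```python
# One Riemannian-flavoured step for a low-rank matrix search: move along the
# Euclidean gradient, then retract to the rank-d manifold by truncated SVD.
# The objective/gradient is a placeholder; plug in the SGNS loss in practice.
import numpy as np

def retract_rank_d(M, d):
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :d] * s[:d]) @ Vt[:d, :]          # best rank-d approximation

def low_rank_step(X, grad, d, lr=0.1):
    return retract_rank_d(X - lr * grad, d)        # gradient step + retraction
```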




Publication date: 2016